Election 2020 – How to Spot Phony Deepfake Videos this Election
Maybe you’ve seen videos where Robert Downey Jr. and other cast members of The Avengers follow the yellow brick road after they swap faces with the cast of 1939’s The Wizard of Oz. Or how about any of the umpteen videos where the face of actor Nicolas Cage is swapped with, well, everybody, from the cast of Friends to Forrest Gump. They’re funny, uncanny, and sometimes a little too real. Welcome to deepfakes, a technology that can be entertaining, yet one that has election year implications—now and for years to come.
What are deepfakes?
Deepfakes are phony video or audio recordings that look and sound real, so much so that the best of them can dupe people into thinking they’re the real thing. They’re not unlike those face-swapping apps your children or nieces and nephews may have on their phones, albeit far more sophisticated. Less powerful versions of deepfaking software are used by the YouTube channels that create the videos I mentioned above. However, more sophisticated deepfake technologies have chilling repercussions when it comes to public figures, such as politicians.
Imagine creating a video of a public figure where you literally put words into their mouth. That’s what deepfakes effectively do. This can lead to threat tactics, intimidation, and personal image sabotage—and in an election year, the spread of disinformation.
Deepfakes sow the seeds of doubt
Deepfakes can make you question whether what you’re seeing and hearing is actually real. In an election year, they can introduce yet another layer of doubt into our discourse, leading people to believe that a political figure has said something they never said. Conversely, they give political figures an “out,” a way to decry a genuine audio or video clip as a deepfake when in fact it is not.
The technology and security industries have responded with their own efforts to detect and uncover deepfakes. Here at McAfee, we’ve launched the McAfee Deepfakes Lab, which provides traditional news and social media organizations with advanced Artificial Intelligence (AI) analysis of suspected deepfake videos, specifically those intended to spread reputation-damaging lies about individuals and organizations during the 2020 U.S. election season and beyond.
However, what can you do when you encounter, or think you’ve encountered, a deepfake on the internet? Just as in my recent blog on election misinformation, a few media-savvy tips point the way.
How to spot deepfakes
While the technology continually improves, there are still telltale signs that the video you’re watching is a deepfake. Creators of deepfakes count on you to overlook the fine details, as today’s technology still has difficulty capturing the subtle features of its subjects. Take a look at:
- Their face. Head movement can cause a slight glitch in the rendering of the image, particularly because the technology works best when the subject is facing toward the camera.
- Their skin. Blotchy patches, irregular skin tones, or flickering at the edges of the face are all signs of deepfake videos.
- Their eyes. Other glitches may come by way of eyeglasses, eyes that look expressionless, and eyes that appear to be looking in the wrong direction. Likewise, the light reflected in their irises may look strangely lit in a way that does not match the setting.
- Their hair. Flyaway hairs remain difficult for deepfakes to reproduce; instead, that head of hair may look a little too perfect.
- Their smile. Teeth don’t always render well in deepfakes, sometimes looking more like white bars instead of showing the usual irregularities we see in people’s smiles. Also, look out for inconsistencies in the lip-syncing.
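For the technically curious, the “glitch” cues above can be sketched as a toy heuristic. This is purely illustrative, not how professional deepfake detection (such as McAfee’s Deepfakes Lab) actually works: we assume per-frame face bounding boxes have already been produced by some face detector (a hypothetical input here), and we simply flag clips whose face position jumps unnaturally between frames, a crude proxy for the rendering glitches described above.

```python
# Illustrative heuristic only; real deepfake detection uses trained AI
# models, not a single statistic. The bounding boxes (x, y, w, h) are
# assumed to come from some face detector run on each frame.

def jitter_score(boxes):
    """Mean frame-to-frame displacement (pixels) of the face-box center."""
    if len(boxes) < 2:
        return 0.0
    centers = [(x + w / 2, y + h / 2) for x, y, w, h in boxes]
    total = 0.0
    for (x0, y0), (x1, y1) in zip(centers, centers[1:]):
        total += ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    return total / (len(boxes) - 1)

def looks_glitchy(boxes, threshold=15.0):
    """Flag a clip whose face position jumps more than `threshold` px/frame.
    The threshold is an arbitrary stand-in, not a calibrated value."""
    return jitter_score(boxes) > threshold

# Smooth, natural head movement: small per-frame shifts.
steady = [(100 + i, 50, 80, 80) for i in range(10)]
# Glitchy rendering: the face "teleports" between frames.
glitchy = [(100, 50, 80, 80), (160, 90, 80, 80)] * 5

print(looks_glitchy(steady))   # False: ~1 px/frame of drift
print(looks_glitchy(glitchy))  # True: ~72 px jumps between frames
```

The point is not the specific numbers but the principle: genuine footage changes smoothly from frame to frame, while rendering artifacts show up as abrupt discontinuities that even simple statistics can surface.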
Listen closely to what they’re saying, and how they’re saying it
This is important. As I pointed out in my recent article on how to spot fake news and misinformation in your social media feed, deepfake content is meant to stir your emotions, whether that’s ridicule, derision, outrage, or flat-out anger. While an emotional response to a video isn’t a hard-and-fast indicator of a deepfake in itself, it should give you pause. Listen to what’s being said. Consider its credibility. Question the motives of the producer or poster of the video. Look to additional credible sources to verify that the video is indeed real.
How the person speaks matters as well. Another component of deepfake technology is audio deepfaking. As recently as 2019, fraudsters used audio deepfake technology to swindle nearly $250,000 from a UK-based energy firm by mimicking the voice of its CEO over the phone. Like its video counterpart, an audio deepfake can sound uncannily real, or at least real enough to sow a seed of doubt. Yet the technology has characteristic shortcomings: audio deepfakes can sound “off,” either cold, as if the normal human emotional cues have been stripped away, or flat in cadence, the way a robocall sounds.
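That “flat cadence” symptom can also be sketched as a toy measurement. Again, this is purely illustrative (production audio-deepfake detection relies on trained models): the idea is that natural speech varies in loudness as the speaker adds emphasis, while a flat, robocall-like delivery barely varies at all. The synthetic sine-wave signals below are hypothetical stand-ins for real speech recordings.

```python
# Illustrative heuristic only: measures how much short-term loudness
# (RMS energy) varies over time. The signals are synthetic stand-ins
# for speech; the sample rate and tone are arbitrary.
import math

def frame_rms(samples, frame_len=200):
    """RMS energy of consecutive non-overlapping frames."""
    return [
        math.sqrt(sum(s * s for s in samples[i:i + frame_len]) / frame_len)
        for i in range(0, len(samples) - frame_len + 1, frame_len)
    ]

def cadence_variation(samples):
    """Standard deviation of frame energies: a low value suggests a
    flat, robocall-like delivery."""
    rms = frame_rms(samples)
    mean = sum(rms) / len(rms)
    return math.sqrt(sum((r - mean) ** 2 for r in rms) / len(rms))

n = 4000
# A monotone tone at constant volume: no cadence at all.
flat = [math.sin(2 * math.pi * 220 * t / 8000) for t in range(n)]
# The same tone with a slow loudness swell, mimicking natural emphasis.
lively = [
    (0.2 + 0.8 * (t / n)) * math.sin(2 * math.pi * 220 * t / 8000)
    for t in range(n)
]

print(cadence_variation(flat) < cadence_variation(lively))  # True
```

A single statistic like this would be trivial to fool; it only shows why “flatness” is something software, and an attentive listener, can pick up on.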
As with all things this election season and beyond, watch carefully and listen critically, and always look for independent confirmation. For more information on our .GOV-HTTPS county website research, potential disinformation campaigns, other threats to our elections, and voter safety tips, please visit our Elections 2020 page: https://www.mcafee.com/enterprise/en-us/2020-elections.html
Stay Updated
To stay updated on all things McAfee and for more resources on staying secure from home, follow @McAfee_Home on Twitter, listen to our podcast Hackable?, and ‘Like’ us on Facebook.